Semantic Coherence Facilitates Distributional Learning of Word Meanings
Abstract
Computational work has suggested that one could, in principle, learn aspects of word meaning simply from patterns of co-occurrence between words. The extent to which humans can do this distributional learning is an open question – artificial language learning experiments have yielded mixed results, prompting suggestions that distributional cues must be correlated with other cues, such as phonological regularities, for successful learning. We conducted a large-scale experiment comparing how distributional learning is afforded by two different types of cues – phonological regularities and semantic coherence. We found that semantic coherence more strongly facilitates distributional learning than onset-consonant phonological regularities.
Similar papers
Word Type Effects on L2 Word Retrieval and Learning: Homonym versus Synonym Vocabulary Instruction
The purpose of this study was twofold: (a) to assess the retention of two word types (synonyms and homonyms) in short-term memory, and (b) to investigate the effect of these word types on word learning by asking learners to learn their Persian meanings. A total of 73 Iranian language learners studying English translation participated in the study. For the first purpose, 36 freshmen from an ...
Is this a Child, a Girl or a Car? Exploring the Contribution of Distributional Similarity to Learning Referential Word Meanings
There has recently been a great deal of work using images of word referents to improve vector-space meaning representations derived from text. We investigate the opposite direction, as it were: trying to improve visual word predictors that identify objects in images by exploiting distributional similarity information during training. We show that for certain words (such as entry-level ...
Semantic coherence facilitates distributional learning
Computational models have shown that purely statistical knowledge about words’ linguistic contexts is sufficient to learn many properties of words, including syntactic and semantic category. For example, models can infer that “postman” and “mailman” are semantically similar because they have quantitatively similar patterns of association with other words (e.g., they both tend to occur with word...
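To make the mechanism concrete, here is a minimal sketch of distributional similarity computed from co-occurrence counts. This is our own illustration, not code from any of the papers above; the toy corpus, the window size, and the use of raw-count vectors with cosine similarity are all assumptions chosen for clarity.

```python
import numpy as np

# Toy corpus (hypothetical sentences, not data from the paper).
corpus = [
    "the postman delivered the mail today",
    "the mailman delivered the mail yesterday",
    "the postman carried a heavy bag",
    "the mailman carried the letters",
    "the dog chased the mailman",
    "the dog chased the postman",
]

# Build a word-by-word co-occurrence matrix within a +/-2 word window.
vocab = sorted({w for sent in corpus for w in sent.split()})
index = {w: i for i, w in enumerate(vocab)}
counts = np.zeros((len(vocab), len(vocab)))

window = 2
for sent in corpus:
    words = sent.split()
    for i, w in enumerate(words):
        for j in range(max(0, i - window), min(len(words), i + window + 1)):
            if i != j:
                counts[index[w], index[words[j]]] += 1

def cosine(u, v):
    """Cosine similarity between two co-occurrence vectors."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Words with quantitatively similar context vectors come out as similar,
# even though "postman" and "mailman" never co-occur with each other.
print(cosine(counts[index["postman"]], counts[index["mailman"]]))
print(cosine(counts[index["postman"]], counts[index["dog"]]))
```

The point of the sketch is that similarity falls out of the context vectors alone: no referent, image, or definition is ever consulted.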
Weak semantic context helps phonetic learning in a model of infant language acquisition
Learning phonetic categories is one of the first steps to learning a language, yet is hard to do using only distributional phonetic information. Semantics could potentially be useful, since words with different meanings have distinct phonetics, but it is unclear how many word meanings are known to infants learning phonetic categories. We show that attending to a weaker source of semantics, in t...
Semantic Composition via Probabilistic Model Theory
Semantic composition remains an open problem for vector space models of semantics. In this paper, we explain how the probabilistic graphical model used in the framework of Functional Distributional Semantics can be interpreted as a probabilistic version of model theory. Building on this, we explain how various semantic phenomena can be recast in terms of conditional probabilities in the graphic...
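As a toy illustration of what "recasting semantic phenomena as conditional probabilities" can look like, consider computing P(property | entity type) from a small joint distribution. This is our own sketch, not the Functional Distributional Semantics model itself; the joint table and its values are invented for illustration.

```python
# Hypothetical joint distribution P(entity_type, property); values are made up.
joint = {
    ("car",   "is_red"):   0.10,
    ("car",   "can_move"): 0.25,
    ("child", "is_red"):   0.01,
    ("child", "can_move"): 0.30,
    ("girl",  "is_red"):   0.01,
    ("girl",  "can_move"): 0.33,
}

def conditional(prop, entity):
    """P(property | entity_type), by marginalizing the joint table."""
    p_entity = sum(p for (e, _), p in joint.items() if e == entity)
    return joint[(entity, prop)] / p_entity

print(conditional("can_move", "car"))   # ~0.71
print(conditional("is_red", "child"))   # ~0.03
```

In the actual framework the graphical model plays the role of this table, so judgments about composed meanings reduce to conditional probability queries of this general shape.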